Quantum Logic of Word Meanings: Concept Lattices in Vector Space Models
Authors
Abstract
This paper systematically develops the logical and algebraic possibilities inherent in vector space models for language, considerably beyond those which are customarily used in semantic applications such as information retrieval and word sense disambiguation. The cornerstone of the approach lies in a simple implementation of the connectives of quantum logic as introduced by Birkhoff and von Neumann (1936), which defines the negation of a concept as the projection onto its orthogonal subspace, and the disjunction and conjunction of two concepts as the vector sum and intersection of their subspaces. This enables us to use the full lattice structure of a vector space, bringing these models much closer to traditional semantic lattice representations such as taxonomic concept hierarchies. We describe selected examples of this process with both negation and disjunction, and summarise experiments which show that the non-local nature of these connectives has clear advantages over their Boolean counterparts in removing the synonyms and neighbours of negated terms in information retrieval, as well as removing the negated terms themselves. Having thus validated the approach, we explore its implications for assigning semantics to some compositional phrases, showing cases where a quantum interpretation is preferable to a traditional Boolean formulation (and vice versa). Finally, we draw attention to the danger that quantum connectives may overgeneralise, and suggest another (also non-Boolean) alternative.
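The quantum connectives described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the vectors and function names here are hypothetical, and the word vectors are toy examples. Negation "a NOT b" projects a onto the orthogonal complement of b, and disjunction is modelled as the subspace spanned (vector sum) by its arguments:

```python
# A minimal sketch of the quantum connectives, assuming toy word vectors.
# Function names (negate, disjoin, similarity_to_subspace) are illustrative.
import numpy as np

def negate(a, b):
    """'a NOT b': project a onto the orthogonal complement of b."""
    b_hat = b / np.linalg.norm(b)
    return a - np.dot(a, b_hat) * b_hat

def disjoin(vectors):
    """'v1 OR v2 OR ...': an orthonormal basis for the vector sum of the
    one-dimensional subspaces, obtained via a QR decomposition."""
    basis, _ = np.linalg.qr(np.column_stack(vectors))
    return basis  # columns span the disjunction subspace

def similarity_to_subspace(a, basis):
    """Cosine of the angle between a and its projection onto the subspace."""
    proj = basis @ (basis.T @ a)
    return np.linalg.norm(proj) / np.linalg.norm(a)

a = np.array([1.0, 2.0, 0.5])
b = np.array([0.0, 1.0, 0.0])
neg = negate(a, b)
print(np.dot(neg, b))  # 0.0 — the negated vector is orthogonal to b
```

Note the non-local character the abstract refers to: negation removes the entire direction of b from a, not just an exact match, so neighbours of b are attenuated as well.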
Similar References
From Logical to Distributional Models
The paper relates two variants of semantic models for natural language, logical functional models and compositional distributional vector space models, by transferring the logic and reasoning from the logical to the distributional models. The geometrical operations of quantum logic are reformulated as algebraic operations on vectors. A map from functional models to vector space models makes it ...
Orthogonal Negation in Vector Spaces for Modelling Word-Meanings and Document Retrieval
Standard IR systems can process queries such as “web NOT internet”, enabling users who are interested in arachnids to avoid documents about computing. The documents retrieved for such a query should be irrelevant to the negated query term. Most systems implement this by reprocessing results after retrieval to remove documents containing the unwanted string of letters. This paper describes and e...
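The retrieval idea sketched in this abstract can be illustrated as follows, under stated assumptions: the term and document vectors below are a hypothetical toy "term space", not data from the paper. Instead of filtering retrieved documents for an unwanted string, the negated term is removed from the query vector itself by orthogonal projection, and documents are ranked by cosine similarity to the result:

```python
# Hypothetical illustration of "web NOT internet" with orthogonal negation.
# Toy 3-d term space: axis 0 ~ arachnids, axis 1 ~ computing, axis 2 ~ other.
import numpy as np

def negate(q, t):
    """Remove the direction of term t from query vector q."""
    t_hat = t / np.linalg.norm(t)
    return q - np.dot(q, t_hat) * t_hat

def cosine(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

web      = np.array([0.7, 0.7, 0.1])   # ambiguous between the two senses
internet = np.array([0.0, 1.0, 0.1])
docs = {
    "spider_biology": np.array([0.9, 0.0, 0.3]),
    "web_browsers":   np.array([0.1, 0.9, 0.2]),
}

query = negate(web, internet)
ranking = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranking)  # the arachnid document now outranks the computing one
```

Because the negated query is orthogonal to the whole "internet" direction rather than to one string, documents about related computing topics are down-ranked even when they never contain the literal negated term.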
Quantum Logic Unites Compositional Functional Semantics and Distributional Semantic Models
When we retrieve information from text by statistical methods, we apply these methods not to random strings of words but to sentences, paragraphs etc. They are ruled by laws of logic inherent to language. Natural language conveys information about individuals (extension) using concepts (intension). The extensional aspect is captured by the familiar logical models, the intensional aspect by dist...
A quantum teleportation inspired algorithm produces sentence meaning from word meaning and grammatical structure
We discuss an algorithm which produces the meaning of a sentence given meanings of its words, and its resemblance to quantum teleportation. In fact, this protocol was the main source of inspiration for this algorithm which has many applications in the area of Natural Language Processing. Quantum teleportation (Bennett et al., 1993) is one of the most conceptually challenging and practically use...
Natural language semantics in biproduct dagger categories
Biproduct dagger categories serve as models for natural language. In particular, the biproduct dagger category of finite dimensional vector spaces over the field of real numbers accommodates both the extensional models of predicate calculus and the intensional models of quantum logic. The morphisms representing the extensional meanings of a grammatical string are translated to morphisms represe...